Evolution, Medicine, and Public Health
Oxford University Press (OUP)
Preprints posted in the last 7 days, ranked by how well they match Evolution, Medicine, and Public Health's content profile, based on 14 papers previously published here. The average preprint has a 0.01% match score for this journal, so anything above that is already an above-average fit.
Strand, P. S.; Trang, J. C.
Female genital cutting (FGC) is identified within global health and human rights discourse as aligned with gender inequality and female disempowerment. The persistence of FGC in high-prevalence societies is assumed to reflect women's limited influence over decisions concerning their daughters. Yet anthropological research has questioned whether this interpretation adequately reflects how FGC is organized within practicing communities. Across two studies with 176,728 participants from 15 African and Asian countries, we examine whether mothers' attitudes toward FGC predict daughters' circumcision status and whether this relationship varies with regional FGC prevalence. Multilevel logistic regression models show that maternal attitudes strongly predict daughter circumcision status across both datasets. Contrary to expectations derived from disempowerment frameworks, the association between maternal attitudes and daughter outcomes is not weaker in high-prevalence contexts; it is stronger. These findings suggest that interpretations of FGC as reflecting female disempowerment may mischaracterize the social dynamics of societies in which FGC is common. Policy implications of the findings are discussed.
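A minimal sketch (not the authors' code) of the attitude-by-prevalence test: a single-level analogue of the multilevel logistic model, with cluster-robust standard errors by region standing in for random intercepts. The input file and column names (daughter_cut, mother_supports_fgc, region_prevalence, region_id) are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("dhs_fgc.csv")  # hypothetical: one row per mother-daughter pair

# daughter_cut (0/1) ~ maternal attitude (0/1) x regional prevalence (0-1)
model = smf.logit("daughter_cut ~ mother_supports_fgc * region_prevalence", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["region_id"]})
print(result.summary())
# A positive interaction term matches the reported pattern: the attitude-outcome
# association strengthens, not weakens, as regional prevalence rises.
```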
Xiao, M.; Girard, Q.; Pender, M.; Rabezara, J. Y.; Rahary, P.; Randrianarisoa, S.; Rasambainarivo, F.; Rasolofoniaina, O.; Soarimalala, V.; Janko, M. M.; Nunn, C. L.
Purpose: Antibiotic use (ABU) is a major driver of antimicrobial resistance (AMR), but ABU patterns are poorly understood in low-income countries where the burden of AMR is great and ABU is insufficiently regulated. Here, we report ABU from ten sites ranging from rural villages to small cities in Madagascar, a country with high AMR levels, and present results from modeling to identify factors that may be associated with ABU in this setting. Methods: We conducted surveys of 290 individuals from ten sites in the SAVA Region of northeast Madagascar to gather data on sociodemographic characteristics, agricultural and animal husbandry practices, recent antibiotic use, the antibiotics that participants recalled using in their lifetimes, and the sources of their antibiotics. Using these data, we conducted statistical analyses with a mixed-effects logistic model to determine which characteristics were associated with recent antibiotic use. Results: Nearly all respondents (N=283, 97.6%) reported ABU in their lifetimes, with amoxicillin being the most widely reported antibiotic (N=255, 90.1% of those reporting ABU). All recalled antibiotics were classified as frontline drugs except for ciprofloxacin. Most respondents who reported antibiotic use also reported obtaining antibiotics without prescriptions from local stores (N=273, 96.5%), while only 52.3% (N=148) reported obtaining antibiotics through a prescriptive route, such as from a health clinic or private doctor. Of the 127 individuals (44.9%) who reported recent ABU, men were significantly less likely than women to have recently taken antibiotics. Conclusions: Our findings provide new insights into ABU in agricultural settings in low-income countries, which have historically been understudied in AMR and pharmacoepidemiologic research. Knowledge of ABU patterns supports understanding of AMR dynamics and AMR control efforts in these contexts, such as interventions on inappropriate antibiotic dispensing.
Key points:
- Antibiotic use (ABU) in Madagascar is largely unstudied despite its role in antimicrobial resistance (AMR), a problem from which Madagascar faces a high burden.
- ABU was widespread among livestock owners in northeast Madagascar, with the majority of study participants reporting ABU in their lifetimes and most people reporting ABU also having taken antibiotics in the previous three months.
- Most respondents reported obtaining their antibiotics from non-pharmaceutical stores, indicating high levels of unregulated ABU, though more than half also reported sourcing their antibiotics through prescriptive means (like doctors and health clinics).
- Men were less likely than women to have taken antibiotics in the previous three months.
- These findings support the development of interventions to mitigate the burden of AMR in Madagascar and similar contexts while underscoring the need for more comprehensive research on the drivers and patterns of ABU.
Plain language summary: In this study, we provide basic information on antibiotic use (ABU) patterns in Madagascar, a country that experiences high levels of resistance but has been particularly understudied in AMR and pharmacological research. We surveyed 290 farmers with livestock from ten sites across northeast Madagascar about their ABU and found that nearly all study participants (N=283, 97.6%) have used antibiotics in their lifetimes, while a little under half of those who reported ABU also reported using antibiotics in the previous three months (N=127, 44.9%). The most used antibiotic was amoxicillin (N=255, 90.1%). Most people obtained their antibiotics from sources that do not require prescriptions, like general stores, indicating that most ABU is unregulated. Through modeling, we also found that men were less likely than women to have taken antibiotics in the previous three months (OR=0.50, CI 0.30-0.82). These findings help us better understand the dynamics of ABU in low-income countries, which have historically been understudied in AMR and pharmacological research. They also support efforts to mitigate the burden of AMR by revealing ABU dynamics that may contribute to the emergence and spread of AMR, as well as identifying targets for intervention to curb inappropriate ABU.
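The reported sex effect (OR=0.50, CI 0.30-0.82) is the exponentiated coefficient from the mixed-effects logistic model. A minimal sketch of that conversion, with a hypothetical log-odds estimate chosen to roughly reproduce the published interval:

```python
import numpy as np

beta, se = -0.693, 0.26  # hypothetical log-odds for "male" and its standard error
or_est = np.exp(beta)
ci = np.exp([beta - 1.96 * se, beta + 1.96 * se])
print(f"OR = {or_est:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")  # ~0.50 (0.30-0.83)
```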
Vaportzis, E.; Edwards, W.
This study investigated retirement adjustment in retired police officers in the UK (N = 289), examining how time since leaving the service moderates the relationship between perceived organisational support and retirement adjustment while accounting for resilience. Results indicated a developmental trend: the influence of organisational support remains stable initially but grows in later life. Using Johnson-Neyman analysis, a threshold of 32.07 years was identified, after which the association reaches statistical significance. These findings suggest an organisational legacy effect; for the older generation, the retrospective perception of being valued by the service acts as a durable psychological resource. This study offers a novel conceptualisation of long-term organisational influence by identifying a temporally delayed legacy effect that extends beyond existing models of retirement adjustment. The study advocates for lifelong wellbeing strategies that extend beyond active service, recognising that the organisational relationship continues to shape adjustment outcomes decades after the conclusion of active duty.
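A minimal sketch (placeholder coefficients, not the paper's estimates) of a Johnson-Neyman scan for the moderation model adjustment = b0 + b1*support + b2*years + b3*support*years, where the conditional effect of support at a given value of years is b1 + b3*years:

```python
import numpy as np

b1, b3 = 0.02, 0.03                          # support effect and interaction term
var_b1, var_b3, cov_b1b3 = 0.04, 0.0003, -0.002  # placeholder (co)variances

years = np.linspace(0, 45, 451)
effect = b1 + b3 * years                     # conditional effect of support
se = np.sqrt(var_b1 + years**2 * var_b3 + 2 * years * cov_b1b3)
sig = np.abs(effect / se) >= 1.96            # two-sided z criterion

if sig.any():
    print(f"effect significant for years >= {years[sig][0]:.1f}")
else:
    print("no Johnson-Neyman region within the observed range")
```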
Moradi Marjaneh, M.; Badhan, A.; Chai, H.; Hadfield, O.; Chen, Y.; Wang, Z.; Thomson, E. C.; Taylor, G. P.; Walker, A. S.; Ansari, M. A.; Barnes, E.; Cooke, G. S.
Background: Ribavirin is a guanosine analogue with clinical antiviral activity against a range of RNA viruses including hepatitis C virus (HCV), respiratory syncytial virus and Lassa virus. Several potential mechanisms of action have been proposed, but clinical data supporting them are limited. Methods: We studied 196 HCV-infected participants from a trial of short-course direct-acting antiviral therapy (STOPHCV-1) which included a factorial randomisation to ribavirin versus no ribavirin. Deep sequencing of the HCV genome was performed on samples with detectable viremia from three time-points: baseline (n = 191), day 3 of treatment (n = 25) and post-treatment failure (n = 47). Results: Ribavirin exposure significantly increased total mutational load at treatment failure (P = 0.0065) and enriched classical ribavirin-associated transitions, including G→A (P = 0.026) and C→U (P = 0.004), along with other key changes including A→G (P = 0.005), U→C (P = 0.023), C→G (P = 0.010), and U→A (P = 0.026). The resulting mutational signature was broad, not dominated by G-related changes. Region-specific analyses demonstrated this increase was broadly distributed across the viral genome, without strong evidence for protection of specific regions. Non-synonymous to synonymous mutation ratios (dN/dS) rose at day 3 (P = 5.5e-5) before declining at failure (P = 8.5e-7), with a trend toward higher dN/dS in the ribavirin group at day 3 (P = 0.06). Conclusions: Ribavirin acts as a potent in vivo mutagen, driving viral populations toward genome-wide diversity rather than selecting a few highly fit drug-resistant clones. These findings support an error-catastrophe model.
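A toy sketch of the kind of bookkeeping behind the transition-enrichment analysis: tallying substitution types between two aligned sequences (e.g., baseline versus failure consensus). The sequences here are invented.

```python
from collections import Counter

def substitution_spectrum(baseline: str, failure: str) -> Counter:
    """Count substitutions (e.g. 'G->A') between two aligned sequences."""
    assert len(baseline) == len(failure)
    return Counter(f"{a}->{b}" for a, b in zip(baseline, failure)
                   if a != b and "-" not in (a, b))

print(substitution_spectrum("GGACUCAGG", "AGACUUAGA"))
# Counter({'G->A': 2, 'C->U': 1}) -- the classical ribavirin-associated transitions
```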
Timonina, V.; Fellay, J.; the Swiss HIV Cohort Study (SHCS)
Clonal hematopoiesis of indeterminate potential (CHIP) is an age-associated condition linked to chronic inflammation and an increased risk of cardiovascular diseases and hematological malignancies. People with HIV (PWH) exhibit a higher prevalence of CHIP than the general population, but the mechanisms underlying this association remain unclear. In particular, it is unknown whether the excess burden of CHIP reflects earlier emergence of mutant clones, altered clonal expansion dynamics, or differences in selective pressures acting on hematopoietic stem cells. We reconstructed longitudinal trajectories of CHIP variant allele frequency (VAF) in 52 PWH using serial peripheral blood samples spanning up to 25 years from the Swiss HIV Cohort Study. We used spline-based modelling to estimate clone size and growth dynamics, and dynamic time warping to identify common trajectory patterns. Associations between clonal dynamics and longitudinal immune parameters were assessed using linear mixed-effects models. Trajectories in PWH were compared with publicly available longitudinal CHIP data from the SardiNIA population cohort. We identified heterogeneous clonal dynamics consistent with known gene-specific fitness patterns. Larger clone size was associated with lower CD4 T-cell count and lower CD4/CD8 ratio. Compared with the general population cohort, PWH showed higher VAF across the observed age range and steeper early trajectory increases, while long-term expansion rates were broadly similar. Greater variability in clonal dynamics among PWH suggests a stronger contribution of host environmental factors to clonal fitness. These findings support a model in which HIV-associated immune dysregulation alters the hematopoietic fitness landscape, contributing to earlier detectable clonal expansion and increased burden of CHIP in PWH.
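A minimal sketch (invented data points, not cohort measurements) of the spline-based trajectory modelling described above: fit a smoothing spline to serial VAF measurements and read the clone's growth rate off its derivative.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

years = np.array([0.0, 3, 6, 10, 14, 18, 22, 25])      # time since first sample
vaf = np.array([0.02, 0.03, 0.05, 0.08, 0.11, 0.15, 0.18, 0.20])

spline = UnivariateSpline(years, vaf, k=3, s=1e-4)      # smoothing cubic spline
growth = spline.derivative()                            # dVAF/dt

for t in (0, 5, 10, 15, 20, 25):
    print(f"t={t:2d}y  VAF={float(spline(t)):.3f}  growth={float(growth(t)):+.4f}/y")
```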
Hayford, C. E.; Baleami, B.; Stauffer, P. E.; Paudel, B. B.; Al'Khafaji, A.; Brock, A.; Quaranta, V.; Tyson, D. R.; Harris, L. A.
Drug-tolerant persisters (DTPs) represent a major obstacle to durable responses in targeted cancer therapy. DTPs are commonly described as distinct single-cell states that survive drug treatment via reversible, non-genetic mechanisms and drive tumor recurrence. Recent work demonstrates that multiple DTPs can coexist, reflecting diversity in lineage, signaling programs, or stress responses. However, each DTP is still generally viewed as a uniform cellular phenotype. Building on our prior work describing a population-level DTP termed "idling" [Paudel et al., Biophys. J. (2018) 114, 1499-1511], here we present evidence supporting a fundamentally different view: that DTPs are not single-cell states, but rather heterogeneous populations composed of multiple sub-states with distinct division and death rates that balance to produce near-zero net population growth. Using single-cell transcriptomics and lineage barcoding, we identify multiple phenotypic states within idling DTP populations, with reduced heterogeneity compared to untreated populations, and find that idling DTP cells emerge from nearly all lineages. Transcriptomic and functional analyses further reveal altered ion-channel activity in idling DTPs, which we confirm experimentally. Moreover, drug-response assays reveal increased susceptibility of idling DTPs to ferroptosis, a non-apoptotic form of regulated cell death, indicating the emergence of vulnerabilities associated with drug tolerance. Altogether, our results support a population-level view of tumor drug tolerance in which DTPs comprise stable collections of phenotypic states, shaped by treatment-defined phenotypic landscapes, which are potentially vulnerable to subsequent interventions. This perspective implies that eradicating DTPs will require a fundamental shift away from cell-type-centric strategies toward sequential treatments that progressively reduce phenotypic heterogeneity by modulating the molecular and cellular processes that establish the DTP landscape, an approach previously termed "targeted landscaping."
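The central claim, that sub-states with distinct division and death rates can balance into near-zero net population growth, can be illustrated with a toy two-state birth-death model (illustrative rates, not measured values from the study):

```python
import numpy as np

r1, r2 = -0.03, 0.02        # net per-day growth (division minus death) per sub-state
k12, k21 = 0.01, 0.02       # switching rates between the two sub-states

# dn/dt = M n: diagonal = net growth minus outflow, off-diagonal = inflow
M = np.array([[r1 - k12, k21],
              [k12,      r2 - k21]])
lam = np.linalg.eigvals(M).real.max()   # dominant growth rate of the mixture
print(f"long-run net growth rate: {lam:+.4f}/day")  # close to zero: an 'idling' population
```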
Hung, J.; Smith, A.
The global ambition to end the human immunodeficiency virus (HIV) epidemic requires understanding which system-level policy levers, enacted under the framework of Universal Health Coverage (UHC), are most effective in achieving both transmission reduction and diagnostic coverage. This study addresses an important evidence gap by quantifying the within-country association between measurable UHC policy indicators and the estimated rate of new HIV infections across nine Southeast Asian countries between 2013 and 2022. Employing a fixed-effects panel data methodology, the analysis controls for time-invariant national heterogeneity, supporting more reliable estimates of policy impact. We found that marginal changes in total current health expenditure (CHE) as a percentage of gross domestic product (GDP) were not statistically significantly associated with changes in HIV incidence. However, increases in the UHC Infectious Disease Service Coverage Index were statistically significantly associated with concurrent reductions in HIV incidence (p < 0.001), suggesting that targeted service implementation is the principal driver of curbing new HIV infections. In addition, the UHC Reproductive, Maternal, Newborn, and Child Health Service Coverage Index exhibited a statistically significant positive association with changes in HIV incidence (p < 0.01), which is interpreted as a surveillance artefact resulting from expanded detection and reporting of previously undiagnosed HIV cases. Furthermore, out-of-pocket (OOP) health expenditure as a percentage of CHE showed a counter-intuitive negative association with changes in HIV incidence (p < 0.01), suggesting that this metric primarily reflects ongoing indirect cost burdens on the established patient cohort or, alternatively, a diagnostic access barrier that results in lower case finding. These findings suggest that policymakers should prioritise investment in targeted infectious disease service efficacy over aggregate fiscal commitment and utilise integrated sexual health platforms for strengthened HIV surveillance and case identification.
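One way to implement the described design, sketched with the linearmodels package; the input file and column names are hypothetical:

```python
import pandas as pd
from linearmodels.panel import PanelOLS

df = pd.read_csv("uhc_hiv_panel.csv")       # hypothetical: 9 countries x 10 years
df = df.set_index(["country", "year"])      # entity-time MultiIndex for panel models

# Country fixed effects absorb time-invariant national heterogeneity
mod = PanelOLS.from_formula(
    "hiv_incidence ~ 1 + che_pct_gdp + uhc_infectious_idx + uhc_rmnch_idx "
    "+ oop_pct_che + EntityEffects", data=df)
res = mod.fit(cov_type="clustered", cluster_entity=True)
print(res)
```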
Salim, A.; Allen, M.; Mariki, K.; Pallangyo, T.; Maina, R.; Mzee, F.; Minja, M.; Msovela, K.; Liana, J.
In the context of global health, the ability of frontline primary health providers to identify potential Drug-Drug Interactions (DDIs) is a critical component of patient safety. This is particularly true in settings like Tanzania, where drug dispensers often serve as the primary point of contact for healthcare. In this study, we establish a baseline for drug decision-making capabilities across multiple cadres of healthcare providers in Kibaha, Tanzania. We specifically distinguish between the ability to recognize safe drug combinations versus harmful ones. The findings reveal a critical asymmetry in provider performance: while professional training improves the recognition of safe combinations, it provides no advantage over lay intuition (and in some cases, a significant disadvantage) in detecting potentially harmful interactions.
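The asymmetry can be made concrete by scoring safe and harmful items separately; a toy sketch with invented response data, not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)
is_harmful = np.array([0, 0, 0, 1, 1, 1], dtype=bool)   # 3 safe, 3 harmful items
# Simulated correctness: providers do well on safe items, worse on harmful ones
answers = rng.random((50, 6)) < np.where(is_harmful, 0.55, 0.85)

print(f"safe combinations:    {answers[:, ~is_harmful].mean():.0%} correct")
print(f"harmful interactions: {answers[:, is_harmful].mean():.0%} correct")
# The gap between the two accuracies is the asymmetry described above.
```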
Haeusler, I. L.; Etoori, D.; Campbell, C. N. J.; McDonald, S. L. R.; Lopez Bernal, J.; Mounier-Jack, S.; Kasstan-Dabush, B.; McDonald, H. I.; Parker, E. P. K.; Suffel, A.
Background: In England, individuals with chronic liver disease (CLD) are among those with the lowest seasonal influenza vaccine uptake despite being at elevated risk of severe influenza. We examined the relationship between CLD severity and aetiology, and influenza vaccine uptake in England. Methods: A retrospective cohort study of adults (18-115 years) using Clinical Practice Research Datalink Aurum primary care data was conducted for five seasons (2019/20-2023/24). Poisson regression was used to estimate rates of uptake by CLD severity (clinical diagnoses categorised as low, moderate, or severe) and aetiology (alcohol-related, viral-related, and diagnoses in the Green Book guidelines). Findings: There were 182,174-277,470 individuals with CLD per cohort. Among those who were additionally age-eligible for vaccination, uptake was 71.1-79.7% compared to 30.9-40.5% in those not additionally age-eligible. Among individuals below age eligibility without other comorbidities, severity was associated with higher uptake (incidence rate ratio [IRR] moderate 1.80, 95% CI 1.69-1.90; severe 1.95, 95% CI 1.84-2.08 in 2023/24); there was no effect in those with at least one additional comorbidity (moderate 1.05, 95% CI 0.99-1.10; severe 1.05, 95% CI 1.01-1.09). Alcohol- and viral-related aetiology were also associated with increased uptake in those not additionally age-eligible. Among individuals meeting age eligibility without additional comorbidities, severity was associated with reduced uptake (moderate 0.81, 95% CI 0.73-0.90; severe 0.79, 95% CI 0.74-0.85), with attenuation in those with additional comorbidities (moderate 0.99, 95% CI 0.94-1.04; severe 0.91, 95% CI 0.89-0.94). Interpretation: CLD severity and aetiology were important determinants of uptake in the absence of additional indications for influenza vaccination. Future research should prioritise understanding facilitators and barriers to vaccine uptake in individuals with CLD, particularly for those at highest risk of severe infection. Funding: NIHR Health Protection Research Unit in Vaccines and Immunisation (NIHR200929/NIHR207408).
Research in context
Evidence before this study: We searched PubMed up to June 2025 using the terms "chronic liver disease", "cirrhosis", "hepatitis", "influenza vaccination", "seasonal influenza", and "vaccine uptake". Previous research, including national data from England, has shown that people with chronic liver disease tend to have lower seasonal influenza vaccine uptake than individuals with other medical comorbidities which qualify for vaccination, such as diabetes, chronic kidney disease or immunosuppression. The reasons for low influenza vaccine uptake in people with chronic liver disease are not well understood, and it is therefore difficult for vaccination providers, principally primary care services in England, to tailor interventions aimed at increasing uptake. Qualitative research involving individuals aged less than 65 years living in England with clinical risk comorbidities, most commonly diabetes, found that chronic disease management pathways inconsistently provided information about the importance of influenza vaccination as part of chronic disease management. Individuals with long-term conditions reported low perceived risk of influenza infection and limited awareness of vaccine benefits as important reasons for non-uptake. We hypothesised that the severity and aetiology of chronic liver disease may be important determinants of uptake.
Added value of this study: We conducted a population-based study to examine how chronic liver disease severity and aetiology influence seasonal influenza vaccine uptake in adults in England. Using primary care electronic health record data from five consecutive influenza seasons (2019/20-2023/24), we found that more severe chronic liver disease was associated with a substantial increase in vaccine uptake in those without additional indications for seasonal influenza vaccination (age-based eligibility or other qualifying clinical risk comorbidities). Alcohol- and viral-related aetiology were also associated with increased uptake in those who were not additionally age-eligible for vaccination. In contrast, severity and alcohol- and viral-related underlying aetiology were associated with a modest reduction in uptake for individuals with chronic liver disease who also qualified for vaccination due to age.
Implications of all the available evidence: Despite clear clinical vulnerability to infection and a substantially elevated risk of morbidity and mortality following infection, a large proportion of adults with chronic liver disease, particularly those aged under 65 years, remain unvaccinated against seasonal influenza each year. This study suggests that chronic liver disease severity and underlying aetiology are important determinants of uptake in individuals not meeting age-based vaccine eligibility, particularly in those without additional clinical risk comorbidities. This could be because of differing perceptions of influenza risk, or due to varying degrees of interaction with healthcare specialists as part of chronic disease management. In individuals who met age-based vaccination eligibility, the negative effect of severity on influenza vaccine uptake may reflect greater barriers to accessing vaccination services by those with more complex health needs, or competing medical priorities for long-term condition management during consultations. To inform targeted vaccination strategies, future research should aim to understand the specific facilitators and barriers to influenza vaccination experienced by individuals with chronic liver disease. This should include perspectives of individuals with different disease severity, across different age groups, and in those with and without additional comorbidities.
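A minimal sketch of the kind of Poisson model behind the quoted IRRs, set up as a modified-Poisson (robust-error) regression for a binary uptake outcome; the file and column names are hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("cld_uptake.csv")  # hypothetical: one row per person-season

# vaccinated (0/1) ~ severity (low/moderate/severe) + covariates
model = smf.glm("vaccinated ~ C(severity, Treatment('low')) + age + C(season)",
                data=df, family=sm.families.Poisson())
res = model.fit(cov_type="HC1")     # robust SEs for a binary-outcome Poisson model
print(np.exp(res.params))           # exponentiated coefficients = incidence rate ratios
```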
Areb, M.; Huybregts, L.; Tamiru, D.; Toure, M.; Biru, B.; Fall, T.; Haddis, A.; Belachew, T.
Background: This study aimed to assess caregiver knowledge of Infant and Young Child Feeding (IYCF), child health, severe acute malnutrition (SAM) screening, and Community-Based Management of Acute Malnutrition (CMAM), its determinants, and associations with IYCF/WaSH (water, sanitation, and hygiene) practices among caregivers of children 6-59 months with SAM in Ethiopian agrarian and pastoralist settings. Methods: Data were from the baseline survey of the R-SWITCH Ethiopia cluster-randomized controlled trial (cRCT), which screened ~28,000 children aged 6-59 months and identified 686 SAM cases. Caregiver knowledge was evaluated using a validated 32-item questionnaire (Cronbach's alpha for internal reliability) and analyzed via linear mixed-effects and Poisson regression models in Stata 17. Results: Caregiver knowledge was positively associated with improved IYCF/WaSH practices among children aged 6-23 months with SAM, including higher minimum dietary diversity (MDD: IRR=1.50), minimum acceptable diet (MAD: IRR=1.63), and reduced zero vegetable/fruit intake (IRR=0.77), as well as MDD in children aged 24-59 months, improved water access (IRR=1.19), water treatment (IRR=2.02), and handwashing stations (IRR=1.41). Literacy (β = 4.1; 95% CI: 1.5-6.6, p = 0.016), pregnancy (β = 4.4; 95% CI: 0.9-7.8, p = 0.018), having the child weighed at a health post/health center (β = 8.9; 95% CI: 3.5-14.2, p ≤ 0.001), and higher household wealth index (β = 11.8; 95% CI: 3.6-20.1, p = 0.005) were associated with higher knowledge, while possible depression (β = -0.3; 95% CI: -0.5 to 0.0, p = 0.015) was associated with lower knowledge. Conclusion: Caregiver knowledge shapes IYCF/WaSH practices among children aged 6-59 months with SAM. Literacy, pregnancy, having the child weighed at a health post or health center, and greater household wealth were associated with caregivers' knowledge, whereas possible depression was associated with lower knowledge. Integrating context-specific caregiver education and mental health support into CMAM, growth monitoring and promotion (GMP), and primary care services could enhance feeding/WaSH practices in Ethiopia.
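The internal-reliability check mentioned above is Cronbach's alpha: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A self-contained sketch with a simulated respondent-by-item matrix (not study data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: shape (n_respondents, k_items); higher scores = more knowledge."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
ability = rng.normal(size=(200, 1))                       # shared latent knowledge
items = (ability + rng.normal(size=(200, 32)) > 0).astype(float)  # 32 binary items
print(f"alpha = {cronbach_alpha(items):.2f}")
```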
Osman, M.; Ashwin, H.; Calder, G.; O'Toole, P.; Bakhiet, S. M.; Musa, A. M.; Kaye, P. M.; Fahal, A. H.
Mycetoma is a neglected tropical disease caused by various bacterial and fungal pathogens that has a significant health impact across a broad geographically defined "mycetoma belt" spanning South America, Africa and Asia. Histologically, mycetoma is characterised by invasive and destructive granuloma development in the skin, deep tissues and bone, leading to tissue destruction, deformities and high morbidity. The presence of macroscopic, highly compacted pathogen microcolonies, or "grains," is a key diagnostic feature, and the formation of grains supports pathogen persistence and disease chronicity. However, there is a paucity of information on immune responses in mycetoma patients and on the relative importance of phylogeny and/or grains in establishing the local immune landscape. Here, we used spatial proteomics to examine the distribution of 43 immune-related proteins in surgical biopsies from 11 patients with mycetoma of bacterial (Actinomycetoma; Actinomadura pelletierii and Streptomyces somaliensis; n=6) and fungal (Eumycetoma; Madurella mycetomatis; n=5) origin. Using mixed-effects modelling, an exploratory analysis across species and pathogen classes revealed few significant differences in immune marker expression. In contrast, and independently of pathogen class, the cellular infiltrate closest to grain boundaries had higher per-cell expression of CD66b+, ARG1, and VISTA. The preferential accumulation of CD66b+ARG1+VISTA+ cells at grain boundaries was confirmed by quantitative immunofluorescence analysis. Hence, the local tissue microenvironment surrounding the mycetoma grain represents a specialised immunosuppressive niche, with parallels to the tumour microenvironment.
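A minimal sketch (hypothetical file and columns) of the mixed-effects setup described above: per-cell marker expression modelled against distance from the grain boundary, with a random intercept per patient:

```python
import pandas as pd
import statsmodels.formula.api as smf

cells = pd.read_csv("mycetoma_cells.csv")  # hypothetical: one row per segmented cell

# expression ~ distance to grain + pathogen class, random intercept per patient
m = smf.mixedlm("cd66b_expr ~ dist_to_grain_um + C(pathogen_class)",
                data=cells, groups=cells["patient_id"])
print(m.fit().summary())
# A negative dist_to_grain_um coefficient = higher expression nearer the grain,
# consistent with the boundary-enriched CD66b+ARG1+VISTA+ infiltrate reported.
```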
Tampubolon, G.
Population ageing increases the importance of cognitive capacity for making decisions about retirement and living independently beyond it. We tested whether post-war educational expansion and working-life social mobility eliminate the association between social class of origin and cognition in early old age using the 1958 National Child Development Study. Two outcomes were analysed at age 62: standard episodic memory (immediate + delayed word recall) and long-term episodic memory, capturing accurate half-century recall of childhood household facts (rooms and people at age 11, validated against mothers' responses). Social mobility trajectories derived in prior work were classified into predominantly manual versus non-manual class trajectories. Models were estimated separately for women and men across three specifications: (i) social origin and controls, (ii) adding social mobility, and (iii) adding weighting to address healthy survivor bias. Education was consistently associated with both outcomes. For long-term episodic memory, social origin gradients were clearer than for standard episodic memory, with men from service/professional origins showing a 13 percentage-point higher probability of accurate half-century recall than men from manual origins. These findings indicate that educational expansion and working-life social mobility failed to release the grip of social origin on long-term episodic memory.
Asplin, P.; Mancy, R.; Keeling, M. J.; Hill, E. M.
Symptom propagation occurs when the symptoms of secondary cases are related to those of the primary case as a result of epidemiological mechanisms. Determining whether - and to what extent - symptom propagation occurs requires data-driven methods. Here we quantify the strength of symptom propagation as the increase in risk of a secondary case developing severe symptoms if the primary case has severe symptoms. We first used synthetic results to determine the data requirements to robustly estimate the strength of symptom propagation and to investigate the effect of severity-dependent reporting bias. Categorising symptom severity into two groups (mild or severe; asymptomatic or symptomatic), our estimation requires only four summary statistics: the number of primary-secondary case pairs for each combination of symptom presentations. Our analysis showed that a relatively small number (100) of synthetic primary-secondary case pairs was sufficient to obtain a reasonable estimate of the strength of symptom propagation, and 1,000 pairs meant errors were consistently small across replicates. Our estimates were robust to severity-dependent reporting bias. We also explored how symptom propagation can be separated from other individual-level factors affecting severity, using age dependence as an example. Although synthetic data generated from an age-structured model led to overestimations of the strength of symptom propagation, allowing disease severity to be age-dependent restored the accuracy of parameter estimation. Finally, we applied our methodology to estimate the strength of symptom propagation from three publicly available data sets collected during the COVID-19 pandemic with data on presence or absence of symptoms: England households, Israel households, and Norway contact tracing. Our age-free methodology indicated a 12-17% increase in the risk of being symptomatic if infected by someone symptomatic. Our positive estimates for the strength of symptom propagation persisted when applying our age-dependent methodology to the two household data sets with age-structured information (England and Israel). These findings demonstrate evidence for symptom propagation of SARS-CoV-2 and provide consistent estimates for its strength. Our synthetic data analysis supports the conclusion that these correlations are not a result of reporting bias or age-dependent effects. This work provides a practical tool for estimating the strength of symptom propagation that has minimal data requirements, enabling application across a wide range of pathogens and epidemiological settings.
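The four summary statistics are the counts of primary-to-secondary severity combinations; the intuition behind the propagation strength is the relative risk of a severe secondary case given a severe versus mild primary. A toy sketch with invented counts (the paper's own estimator is likelihood-based, so this is only the underlying idea):

```python
# counts of (primary severity, secondary severity) pairs -- the four summary statistics
n = {("severe", "severe"): 120, ("severe", "mild"): 280,
     ("mild",   "severe"):  75, ("mild",   "mild"): 525}

p_sev_given_sev = n[("severe", "severe")] / (n[("severe", "severe")] + n[("severe", "mild")])
p_sev_given_mild = n[("mild", "severe")] / (n[("mild", "severe")] + n[("mild", "mild")])
print(f"RR = {p_sev_given_sev / p_sev_given_mild:.2f}")  # > 1 indicates propagation
```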
Polonsky, J.; Hudu, S.; Uthman, K.; Katuala, Y.; Evbuomwan, P. E.; Osman, H. J. O.; Sulaiman, A. K.; Adjaho, I. I.; Doumbia, C. O.; Gignoux, E.; Ale, F.
Background: During Nigeria's largest recorded diphtheria outbreak, hospital capacity in Kano State was rapidly overwhelmed. Médecins Sans Frontières introduced home-based care (HBC) for patients with mild disease to prioritise facility-based care for severe cases. We assessed whether HBC was non-inferior to facility-based treatment in terms of mortality, sequelae, and household transmission. Methods: We conducted a retrospective matched cohort study. Mild diphtheria cases treated between January 2023 and May 2024 were matched 1:1 by treatment modality (HBC or diphtheria treatment centre [DTC]) on sex, age group, vaccination status, and residence. Conditional logistic regression estimated the association between treatment modality and mortality, with robustness assessed through propensity score weighting, sensitivity analyses, and E-value computation. Findings: Of 990 sampled patients, 678 (367 HBC, 311 DTC) were enrolled (68.5%). After adjustment, treatment modality was not independently associated with mortality (HBC vs. DTC: aOR 0.40, 95% CI 0.13-1.30), with similar estimates across sensitivity analyses (E-value 4.40). Clinical complications were the strongest predictor of death (aOR 23.1, 95% CI 1.73-307). Vaccination was protective (aOR 0.28, 95% CI 0.08-0.94) and a treatment delay of four or more days increased mortality (aOR 4.15, 95% CI 1.23-14.0). HBC was not associated with increased household transmission or long-term sequelae. Interpretation: Vaccination and early treatment, rather than care setting, were the main determinants of survival. When supported by clinical triage and structured follow-up, decentralised care can be used to manage mild cases during diphtheria epidemics in settings with constrained hospital capacity.
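The E-value quantifies how strong unmeasured confounding would need to be to explain away an estimate. A sketch of the VanderWeele and Ding formula, applied to a protective odds ratio by first inverting it (treating the OR as an approximate risk ratio):

```python
import math

def e_value(rr: float) -> float:
    rr = max(rr, 1 / rr)                 # use the direction away from the null
    return rr + math.sqrt(rr * (rr - 1))

print(f"{e_value(0.40):.2f}")  # ~4.44, close to the reported E-value of 4.40
```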
Hudu, S.; Uthman, K.; Katuala, Y.; Bello, I. W.; Mbuyi, Y.; Worku, D. T.; Mbelani, S. C.; Adjaho, I. I.; Gignoux, E.; Doumbia, C. O.; Ale, F.; Polonsky, J.
Background: Nigeria has experienced its largest recorded diphtheria outbreak since late 2022, centred on Kano State, where facility-based surveillance documented over 25,000 confirmed cases. The true community burden remains unknown. We conducted a population-based household survey to estimate community attack rates, mortality, vaccination coverage, and determinants of infection and death. Methods: We performed a retrospective household survey (September-October 2024) using spatially randomised cluster sampling (65 clusters, ~15 households each; recall period January 2023 to interview). Survey-weighted analyses, multivariable logistic regression, and sensitivity analyses were used. Findings: We enrolled 7,998 individuals from 1,068 households. The community attack rate was 1.1% (95% CI 0.7-1.4), 4.2 times (2.7-5.3) higher than facility-based estimates. The case fatality ratio was 8.8% (1.9-15.6) overall and 21.3% among children under five; two thirds of deaths occurred at home. Delayed care-seeking of four or more days was associated with markedly higher mortality (risk ratio 32.6, 95% CI 2.4-450.0). Vaccination was strongly protective against death (vaccine effectiveness 57%, 95% CI 34-72%; E-value 4.07). Among campaign-eligible children, routine EPI coverage was 56.6%; the reactive campaign reached few previously unvaccinated children (99.7% overlap with prior recipients), leaving 11.6% of eligible children unvaccinated. Interpretation: Community diphtheria burden substantially exceeded facility surveillance estimates, with most deaths occurring outside the health system. Delayed care-seeking and low vaccination coverage were the main drivers of mortality, highlighting the need for improved community surveillance, decentralised care, and better-targeted vaccination.
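A minimal sketch of cohort-style vaccine effectiveness, VE = 1 - RR, with illustrative counts (not the survey's weighted data; the paper's 57% estimate concerns protection against death):

```python
ar_vacc = 40 / 5000      # attack rate among vaccinated (illustrative)
ar_unvacc = 56 / 3000    # attack rate among unvaccinated (illustrative)

ve = 1 - ar_vacc / ar_unvacc
print(f"VE = {ve:.0%}")  # 57% with these toy counts
```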
Masegese, T.; Mung'ong'o, G. S.; Kamala, B.; Anaeli, A.; Bago, M.; Mtoro, M. J.
Background: HIV/AIDS remains a major public health challenge in Tanzania, where viral load suppression among adults on ART stands at 78% and HIV viral load (HVL) testing uptake among eligible patients is approximately 22%. Since the introduction of the National HVL Testing Guideline in 2015, little has been done to systematically evaluate its implementation. Objective: To evaluate adherence to the National HVL Testing Guideline across CTC clinics in Dar es Salaam Region, covering ART monitoring, documentation, turnaround time, and factors affecting implementation. Methods: A cross-sectional study was conducted in 2021 across 15 public health facilities with CTC clinics in all five Dar es Salaam districts. A total of 330 PLHIV on ART for more than six months were selected through systematic random sampling with allocation proportional to size, and 45 healthcare providers through convenience sampling. Data were collected via abstraction forms and self-administered questionnaires, and analysed using SPSS Version 23 with descriptive statistics, bivariate analysis, and binary logistic regression. Results: Only 25.1% of patients had their first HVL sample taken at six months as per guideline, with 68.8% delayed beyond six months. Second and third samples were similarly delayed. MoHCDGEC sample tracking forms were absent in 96.7% of facilities and incomplete in 99.1%, and no facility captured specimen acceptance or rejection as site feedback. Turnaround time exceeded the 14-day guideline threshold in 64.5%, 66.7%, and 69.4% of first, second, and third results respectively. Patient negligence (AOR=9.84; 95% CI: 1.83-52.77) and storage capacity (AOR=5.72; 95% CI: 0.94-35.0) were independently associated with guideline adherence. Conclusion: Adherence to the National HVL Testing Guideline in Dar es Salaam is suboptimal across testing timelines, documentation, and turnaround time, with patient negligence and storage capacity as significant determinants. Targeted interventions are needed to strengthen patient education, improve storage infrastructure, enhance documentation systems, and support providers in adhering to guideline-specified timelines.
Xu, J.; Parker, R. M. A.; Bowman, K.; Clayton, G. L.; Lawlor, D. A.
Background: Higher levels of sedentary behaviour, such as leisure screen time (LST), and lower levels of physical activity are associated with diseases across multiple body systems, which contribute to a large global health burden. Whether these associations are causal is unclear. The primary aim of this study is to investigate the causal effects of higher LST (given greater power) and, secondarily, lower moderate-to-vigorous intensity physical activity (MVPA), on a wide range of diseases in a hypothesis-free approach. Methods: A two-sample Mendelian randomisation phenome-wide association study was conducted for the main analyses. Single nucleotide polymorphisms (SNPs) were first selected as genetic instruments for LST (hours of television watched per day; 117 SNPs) and MVPA (higher vs. lower; 18 SNPs) based on the genome-wide significance threshold (p < 5×10^-8) from the largest relevant genome-wide association study (GWAS). For disease outcomes, we used summary results from FinnGen GWAS, including 1,719 diseases defined by hospital discharge International Classification of Diseases (ICD) codes in 453,733 European participants. For the main analyses, we used the inverse-variance weighting method with a Bonferroni-corrected p-value threshold of p ≤ 3.47×10^-4. Sensitivity analyses included Steiger filtering, MR-Egger and weighted median analyses, and data from UK Biobank were used to explore replication. Findings: Genetically predicted higher LST was associated with increased risk of 87 (5.1% of the 1,719) diseases. Most of these diseases were in the musculoskeletal and connective tissue (n=37), genitourinary (n=12) and respiratory (n=8) systems. Genetic liability to lower MVPA was associated with six diseases: three in the musculoskeletal and connective tissue and genitourinary systems (with greater risk of these diseases also identified with higher LST), and three in the respiratory and genitourinary systems. Sensitivity analyses largely supported the main analyses. Results replicated in UK Biobank where data were available. Conclusions: Higher levels of sedentary behaviour, and lower levels of physical activity, causally increase the risk of diseases across multiple body systems, making them promising targets for reducing multimorbidity.
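A minimal sketch of the inverse-variance-weighted (IVW) MR estimate from per-SNP summary statistics, using standard first-order weights; the arrays are toy values standing in for the 117 LST instruments:

```python
import numpy as np

beta_x = np.array([0.05, 0.08, 0.04, 0.06])       # SNP -> exposure (LST)
beta_y = np.array([0.010, 0.018, 0.007, 0.012])   # SNP -> outcome (disease log-odds)
se_y = np.array([0.004, 0.005, 0.003, 0.004])     # SEs of the outcome associations

w = beta_x**2 / se_y**2                # first-order inverse-variance weights
beta_ivw = np.sum(w * beta_y / beta_x) / np.sum(w)  # weighted mean of ratio estimates
se_ivw = np.sqrt(1 / np.sum(w))
print(f"IVW estimate = {beta_ivw:.3f} (SE {se_ivw:.3f})")
```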
Bakamutumaho, B.; Lutwama, J. J.; Owor, N.; Lu, X.; Eliku, P. J.; Namulondo, J.; Kayiwa, J.; Ross, J. E.; Nsereko, C.; Nsubuga, J. B.; Shinyale, J.; Asasira, I.; Kiyingi, T.; Reynolds, S. J.; Nie, K.; Kim-Schulze, S.; Cummings, M. J.
Objective: Biologically defined sepsis subtypes have been identified in low- and middle-income countries (LMICs), but limited access to molecular diagnostics challenges broader evaluation and implementation in resource-limited settings. We assessed whether models including bedside clinical and rapid microbiologic data could accurately stratify Ugandan adults with sepsis by molecular subtype. Design: Secondary analysis of two prospective observational sepsis cohorts, testing bedside-adaptable classifier models against transcriptomic and proteomic subtype assignments. Setting: Entebbe Regional Referral Hospital (urban) and Tororo General Hospital (rural), Uganda. Patients: Adults (≥18 years) hospitalized with sepsis, with available transcriptomic (N=355) and/or proteomic (N=495) profiling enabling subtype assignment. Interventions: None. Measurements and Main Results: Using data from two prospective cohorts (RESERVE-U-2-TOR and RESERVE-U-1-EBB), we evaluated bedside-adaptable models against Uganda-derived molecular sepsis subtypes and, secondarily, against molecular subtypes and axes derived in high-income countries. In RESERVE-U-2-TOR, clinical models including demographics and bedside physiological variables demonstrated moderate discrimination for transcriptomic and proteomic subtype assignment (AUROC 0.75 [95% CI, 0.69-0.81] and 0.73 [0.66-0.80], respectively) with strong calibration (Integrated Calibration Index [Eavg] ≤0.015 for both models). Adding rapid diagnostic results for HIV, malaria, and tuberculosis produced similar performance (AUROC 0.76 and 0.74; Eavg ≤0.016). In RESERVE-U-1-EBB, discrimination for clinical and clinico-microbiological models was more variable (AUROC range 0.63-0.75) while calibration remained acceptable (Eavg ≤0.053). Performance was similar when models were evaluated against molecular sepsis frameworks derived in high-income countries, with acceptable calibration and moderate discrimination. Conclusions: Bedside-adaptable clinical models, with or without rapid microbiologic testing, demonstrated acceptable calibration but only modest discrimination for molecular sepsis subtype assignment in Uganda. Expanding laboratory capacity and access to scalable, low-cost molecular biomarker assays will be necessary to advance precision sepsis care in LMIC settings.
Key Points
Question: Among adults hospitalized with sepsis in a resource-limited setting, can bedside clinical variables, alone or combined with rapid pathogen diagnostics, accurately stratify molecular sepsis subtype assignments? Findings: In two prospective Ugandan sepsis cohorts, bedside clinical and clinico-microbiologic models showed robust calibration but only modest discrimination for classifying Uganda-derived transcriptomic and proteomic subtypes. Models also achieved moderate performance for stratifying high-income-country-derived transcriptomic subtypes and immune dysfunction axes, suggesting bedside variables reflect illness severity but incompletely capture underlying molecular signatures. Meaning: Bedside-adaptable models can support reasonably calibrated risk estimation for molecular sepsis stratification in resource-limited settings but lack sufficient discriminatory power to serve as stand-alone tools. These findings support efforts to improve acute-care laboratory capacity and access to scalable molecular biomarker panels, with the goal of enabling precision sepsis care in low- and middle-income countries.
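A sketch of how discrimination (AUROC) and an Eavg-style integrated calibration index can be computed for a subtype classifier; the data are simulated and the paper's exact ICI implementation may differ:

```python
import numpy as np
from sklearn.metrics import roc_auc_score
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(2)
p = rng.uniform(0.05, 0.95, 400)          # predicted probability of subtype 1
y = (rng.random(400) < p).astype(int)     # toy outcomes, well calibrated by design

auroc = roc_auc_score(y, p)
smoothed = lowess(y, p, frac=0.5, return_sorted=False)  # observed rate given p
eavg = np.mean(np.abs(p - smoothed))      # mean |predicted - smoothed observed|
print(f"AUROC = {auroc:.2f}, Eavg = {eavg:.3f}")
```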
Shepherd, F.; Slaney, C.; Jones, H. J.; Dardani, C.; Stergiakouli, E.; Sanderson, E. C. M.; Hamilton, F.; Rosoff, D. B.; Rek, N.; Gaunt, T. R.; Davey Smith, G.; Richardson, T. G.; Khandaker, G. M.
Systemic inflammation is implicated in various diseases, yet its upstream determinants remain poorly examined. We conducted a large-scale two-sample Mendelian randomisation (MR) study to systematically evaluate the potential causal effects of 3,213 molecular (metabolomic, proteomic), physiological and disease traits on circulating interleukin-6 (IL-6) and C-reactive protein (CRP) levels. Genetic instruments were derived from genome-wide association studies and analysed using inverse-variance weighted (IVW), weighted median, and MR-Egger methods with multiple-testing correction. Bidirectional MR was performed to assess reverse causation. After Bonferroni correction, evidence of potential causal effects was observed for 72 traits on CRP and 9 traits on IL-6. CRP was predominantly influenced by metabolomic traits, especially lipid and fatty acid measures. Genetically proxied adiposity (body mass index and obesity), triglyceride-rich lipoproteins, glycoprotein acetyls (GlycA), and apolipoprotein E increased CRP levels, whereas HDL-related cholesterols, polyunsaturated fatty acids, and glutamine decreased CRP. Most associations were consistent across MR methods, supporting the robustness of these results. As expected, IL-6 had a large effect on CRP. IL-6 was influenced primarily by adiposity and HDL-related lipid measures, with generally smaller effect sizes and limited support across sensitivity analyses. Bidirectional analyses indicated little evidence that CRP directly drives metabolic traits when restricting to cis-acting instruments, whereas genetically proxied IL-6 signalling showed consistent downstream effects on HDL particle concentration and composition. Adiposity is a shared upstream determinant of both inflammatory biomarkers, with stronger and broader effects on CRP. These findings suggest that CRP acts as an integrated downstream readout of systemic inflammatory burden, whereas IL-6 reflects a more tightly regulated and context-dependent process. Our work clarifies traits that may causally influence systemic inflammation and highlights biological pathways linking inflammation to cardiometabolic and inflammatory diseases. By mapping upstream determinants of IL-6 and CRP, we also provide a resource to prioritise key drivers for mechanistic study and therapeutic targeting.
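Complementing the IVW sketch above, MR-Egger can be written as a weighted regression of outcome effects on exposure effects with an unconstrained intercept, where a non-zero intercept flags directional pleiotropy. Toy inputs, not the study's summary statistics:

```python
import numpy as np
import statsmodels.api as sm

beta_x = np.array([0.06, 0.09, 0.05, 0.07, 0.11])       # SNP -> trait (e.g. adiposity)
beta_y = np.array([0.015, 0.020, 0.011, 0.016, 0.026])  # SNP -> CRP
se_y = np.array([0.004, 0.005, 0.004, 0.004, 0.006])

X = sm.add_constant(beta_x)                   # intercept tests directional pleiotropy
egger = sm.WLS(beta_y, X, weights=1 / se_y**2).fit()
print(egger.params)  # [intercept (pleiotropy), slope (causal estimate)]
```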
Zhai, T.; Babu, M.; Fuentealba, M.; Al Dajani, S.; Gladyshev, V. N.; Furman, D.; Snyder, M.
Quantitative measures for tracking functional health have generally been lacking. Intrinsic capacity (IC) has been proposed as an appropriate measure, but its metrics have been derived from small datasets and sparse longitudinal data. Using harmonized measures of cognition, locomotion, sensory function, vitality, and psychological well-being from 501,615 UK Biobank participants followed for a median of 15.5 years, we derived domain-specific and composite IC scores. We examined associations with incident disease, cause-specific mortality, multimorbidity, lifestyle and socioeconomic factors, and multi-omic profiles from Olink proteomics, NMR metabolomics, clinical biochemistry, and blood-cell traits. We found that composite IC declined non-linearly with age, and within-person decline was steeper than cross-sectional age comparisons suggested. Participants with greater baseline morbidity, those who subsequently developed incident disease, and those who died earlier in follow-up showed lower IC trajectories across adulthood. The IC domains were only modestly correlated with one another, supporting multidimensionality, yet higher overall IC was associated with lower risk of most diseases examined. The dominant IC domain varied by endpoint, with cognition informative for dementia, sensory function for hearing loss, psychological capacity for depression, locomotion for osteoarthritis, and vitality for cardiometabolic outcomes. IC was also associated cross-sectionally with physical activity, insomnia, smoking, medication burden, and socioeconomic disadvantage. More proteins were predictive of vitality than of any other domain, and enrichment converged on immune/inflammatory and metabolic pathways. Blood-based surrogates recapitulated part of the phenotypic signal, particularly for vitality. Overall, this IC framework captures longitudinal health trajectories and broad disease vulnerability in a large middle- to older-aged cohort, supports IC as a clinically meaningful, multidomain phenotype of aging, and identifies blood-based correlates that may facilitate future at-scale monitoring of aging-related functional decline.
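A minimal sketch of one common way to build a composite intrinsic-capacity score, as the mean of z-scored domain measures; the file and column names are hypothetical and the paper's exact scoring may differ:

```python
import pandas as pd

df = pd.read_csv("ukb_ic.csv")  # hypothetical: one row per participant
domains = ["cognition", "locomotion", "sensory", "vitality", "psychological"]

z = (df[domains] - df[domains].mean()) / df[domains].std()  # z-score each domain
df["ic_composite"] = z.mean(axis=1)   # higher = greater intrinsic capacity
print(df["ic_composite"].describe())
```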